
feat: improve contours caching calculation #3243

Draft · wants to merge 8 commits into base: main

Conversation
@lpatiny (Member) commented Sep 13, 2024

No description provided.

@lpatiny lpatiny marked this pull request as draft September 13, 2024 08:55

cloudflare-workers-and-pages bot commented Sep 13, 2024

Deploying nmrium with Cloudflare Pages

Latest commit: 7390bed
Status: ✅  Deploy successful!
Preview URL: https://191b937c.nmrium.pages.dev
Branch Preview URL: https://2024-09-13-10-53.nmrium.pages.dev


@lpatiny lpatiny changed the title wip: testing conrect wip: testing conrec Sep 13, 2024
@lpatiny (Member, Author) commented Sep 13, 2024

@hamed-musallam This is the POC of the cache. I did it quickly as a global variable, but it should of course be done at the level of each spectrum, probably using a useRef.

I also did a small fix in ml-conrec to improve performance when nothing needs to be calculated.
@hamed-musallam Could you take over this PR or use it in the other PR?
It may also be interesting to keep an eye on memory, using a logger.trace message with the total number of lines in the cache.

Once done we will ask Michael to check if it is ok.
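A minimal sketch of what such a per-spectrum cache could look like (the `createContoursCache` helper and its API are hypothetical, not the actual nmrium code); each spectrum would hold one instance, e.g. in a `useRef`:

```javascript
// Hypothetical per-spectrum contour cache (sketch, not the nmrium implementation).
// In a React component it would be held in a useRef so the cache survives
// re-renders without being shared globally between spectra.
function createContoursCache() {
  const cache = new Map(); // level -> array of contour lines

  return {
    has: (level) => cache.has(level),
    get: (level) => cache.get(level),
    set: (level, lines) => cache.set(level, lines),
    get size() {
      return cache.size;
    },
    // Total number of cached lines, handy for the logger.trace memory check
    // suggested above.
    get totalLines() {
      let total = 0;
      for (const lines of cache.values()) total += lines.length;
      return total;
    },
  };
}
```

In a component this would be something like `const cacheRef = useRef(createContoursCache())`, so each spectrum keeps its own cache.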

@hamed-musallam hamed-musallam changed the title wip: testing conrec feat: improve contours caching calculation Sep 16, 2024
@lpatiny (Member, Author) commented Sep 16, 2024

If I check the size of the cache:

  for (const level of _range) {
    console.log(level, cache.has(level), cache.size);
    if (cache.has(level)) {
      // ... reuse the cached lines for this level
    }
  }
I read 414. Could there be an issue with rounding when calculating the different levels? I think the possible levels should be calculated once and for all per spectrum to avoid this issue.

This was done with a 2D COSY (screenshots of the cache-size logs omitted).
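The 414 cached entries (instead of at most 101) are consistent with the levels being recomputed each time with floating-point arithmetic: mathematically equal levels can differ in their last bits, so a Map keyed on the raw numbers misses. A small demo of the effect and two possible fixes (the `computeLevels` and `keyOf` helpers are illustrative, not nmrium code):

```javascript
// Demo (not nmrium code): recomputing float levels can defeat a Map cache,
// because mathematically equal values can differ in the last bits.
const a = 0.1 + 0.2; // 0.30000000000000004
const b = 0.3;
console.log(a === b); // false: a Map keyed by `a` misses a lookup with `b`

// Fix 1: compute the possible levels once per spectrum and reuse the same
// array everywhere, so the cache keys are always the exact same values.
function computeLevels(min, max, count) {
  const step = (max - min) / (count - 1);
  return Array.from({ length: count }, (_, i) => min + i * step);
}

// Fix 2: normalize keys before touching the cache, so values that are equal
// up to rounding noise map to the same entry.
const keyOf = (level) => level.toPrecision(12);
```

Either fix would keep the cache bounded by the number of possible levels.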

@lpatiny lpatiny requested a review from targos October 8, 2024 20:17
@lpatiny (Member, Author) commented Oct 8, 2024

@targos Here are some comments / ideas.

If both positive and negative curves need to be calculated, we should think about doing both in a single pass. It seems much faster.

In conrec it would be better to have a limit on the number of segments per layer. Currently we only have a global timeout, and if that timeout is reached nothing is displayed at all. A per-layer limit could also prevent out-of-memory crashes, despite the fact that we are using typed arrays to limit this risk.
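A sketch of the per-layer limit idea (the `traceLayer` function and its shape are hypothetical, not the ml-conrec API): instead of a single global timeout that discards everything, each layer stops cleanly at a cap and returns a partial result.

```javascript
// Hypothetical per-layer segment cap for a conrec-style tracing loop
// (sketch only, not the actual ml-conrec API).
function traceLayer(segmentsIterator, maxSegmentsPerLayer) {
  const segments = [];
  let truncated = false;
  for (const segment of segmentsIterator) {
    if (segments.length >= maxSegmentsPerLayer) {
      truncated = true; // report a partial layer instead of nothing
      break;
    }
    segments.push(segment);
  }
  return { segments, truncated };
}
```

The caller can then decide to render the truncated layer and/or log a warning, rather than dropping the whole result as the global timeout does today.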

We could think about passing a fifo-logger to conrec to get some debug information, as well as the error if the timeout was reached.

Conrec could run in a worker. There are only 101 possible levels, for positive and negative. We could start by calculating the 10 default levels and then, in a web worker, start the calculation of the other 91 levels. Because we are using typed arrays, transferring the data should be fast.
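The "transfer should be fast" point is because typed-array buffers are transferable: handing them to a worker moves the buffer instead of copying it. `structuredClone` with a transfer list (Node 17+/modern browsers) shows the same mechanics synchronously; in a real worker it would be `worker.postMessage(data, [data.buffer])`:

```javascript
// Typed arrays are backed by ArrayBuffers, which are transferable objects.
// structuredClone with a transfer list demonstrates the move semantics that
// postMessage(data, [data.buffer]) would use with an actual web worker.
const levels = new Float64Array([0.1, 0.2, 0.4, 0.8]);
const moved = structuredClone(levels, { transfer: [levels.buffer] });

console.log(moved.length); // 4: the data arrived intact on the "other side"
console.log(levels.buffer.byteLength); // 0: the original buffer was detached, not copied
```

Since the buffer is moved rather than serialized, the cost is independent of the spectrum size, which is what makes background calculation of the remaining 91 levels cheap to hand off.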

We could still do some profiling to determine the speed-limiting step. We also need to be certain that only 101 levels are calculated in practice.

The goal is to be able to smoothly change the vertical scale of the following spectrum (from Perth):

3.zip
